
# MS MARCO Pretraining

## Msmarco Distilbert Word2vec256k MLM 230k
This model is a pre-trained language model based on the DistilBERT architecture. Its 256k-entry vocabulary is initialized from word2vec embeddings, and the model is trained on the MS MARCO corpus with a masked language modeling (MLM) objective.
Tags: Large Language Model, Transformers
Publisher: vocab-transformers
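
A minimal usage sketch, assuming the checkpoint is published on the Hugging Face Hub under the repository id `vocab-transformers/msmarco-distilbert-word2vec256k-MLM_230k` (inferred from the model name above; verify the exact id on the Hub). It loads the model with the standard `transformers` masked-LM classes and queries the MLM head for a masked token:

```python
from transformers import AutoTokenizer, AutoModelForMaskedLM
import torch

# Assumed repository id; check the Hugging Face Hub before relying on it.
repo_id = "vocab-transformers/msmarco-distilbert-word2vec256k-MLM_230k"

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForMaskedLM.from_pretrained(repo_id)

# A simple probe sentence with one masked position.
text = f"The capital of France is {tokenizer.mask_token}."
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Locate the [MASK] position and print the top-5 predicted tokens.
mask_index = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
top_ids = logits[0, mask_index].topk(5).indices[0]
print(tokenizer.convert_ids_to_tokens(top_ids.tolist()))
```

Because the vocabulary embeddings were initialized from word2vec rather than learned from scratch, the 256k vocabulary covers far more surface forms than DistilBERT's default ~30k WordPiece vocabulary, which is the main point of this variant.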